Information geometry of divergence functions
Authors
Abstract
Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures that do not satisfy the properties of a distance. The Bregman divergence, Kullback-Leibler divergence and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold induced by a divergence function. It consists of a Riemannian metric and a pair of dually coupled affine connections, which are studied in information geometry. The class of Bregman divergences is characterized by a dually flat structure, which originates from the Legendre duality. A dually flat space admits a generalized Pythagorean theorem. The class of f-divergences, defined on a manifold of probability distributions, is characterized by information monotonicity, and the Kullback-Leibler divergence belongs to the intersection of both classes. The f-divergence always gives the α-geometry, which consists of the Fisher information metric and a dual pair of ±α-connections. The α-divergence is a special class of f-divergences. It is unique in sitting at the intersection of the f-divergence and Bregman divergence classes in a manifold of positive measures. The geometry derived from the Tsallis q-entropy and related divergences is also addressed.
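As a hedged illustration (a sketch built from the standard textbook definitions, not code from the paper), the following Python snippet checks numerically that the Kullback-Leibler divergence on discrete probability vectors arises in both ways the abstract describes: as the Bregman divergence generated by the negative Shannon entropy, and as the f-divergence with f(t) = t log t.

    import numpy as np

    def bregman_kl(p, q):
        # Bregman divergence D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>,
        # with phi(p) = sum_i p_i log p_i (the negative Shannon entropy).
        # On probability vectors (sum p = sum q = 1) this equals KL(p || q).
        phi = lambda x: np.sum(x * np.log(x))
        grad_phi = lambda x: np.log(x) + 1.0
        return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

    def f_div_kl(p, q):
        # f-divergence D_f(p, q) = sum_i q_i f(p_i / q_i) with f(t) = t log t.
        t = p / q
        return np.sum(q * t * np.log(t))

    p = np.array([0.2, 0.5, 0.3])
    q = np.array([0.4, 0.4, 0.2])
    print(bregman_kl(p, q), f_div_kl(p, q))  # both print KL(p || q) = sum_i p_i log(p_i / q_i)

Both values agree, which is the sense in which the Kullback-Leibler divergence sits at the intersection of the Bregman and f-divergence classes.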
Similar articles
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper we examine measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, … and so on. Properties and results related to distances between probability d...
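The measures named in this entry have standard closed forms for discrete distributions; as an illustrative sketch (the usual textbook definitions, not taken from the article itself):

    import numpy as np

    def kl(p, q):
        # Kullback-Leibler information KL(p || q) = sum_i p_i log(p_i / q_i)
        return np.sum(p * np.log(p / q))

    def j_divergence(p, q):
        # J-divergence: the symmetrized KL, KL(p || q) + KL(q || p)
        return kl(p, q) + kl(q, p)

    def hellinger(p, q):
        # Hellinger distance H(p, q) = (1 / sqrt(2)) * || sqrt(p) - sqrt(q) ||_2
        return np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / np.sqrt(2)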
Exponentially concave functions and a new information geometry
A function is exponentially concave if its exponential is concave. We consider exponentially concave functions on the unit simplex. It is known that gradient maps of exponentially concave functions are solutions of a Monge-Kantorovich optimal transport problem and allow for a better gradient approximation than those of ordinary concave functions. The approximation error, called L-diver...
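For a concrete instance (an illustrative choice, not drawn from the abstract), φ(p) = (1/n) Σ_i log p_i is exponentially concave on the simplex: exp φ is the geometric mean of p, which is concave. The snippet below spot-checks midpoint concavity of exp φ at random points of the simplex.

    import numpy as np

    rng = np.random.default_rng(0)

    def phi(p):
        # phi(p) = (1/n) * sum_i log p_i, so exp(phi(p)) is the geometric mean of p,
        # which is concave on the simplex -- hence phi is exponentially concave.
        return np.mean(np.log(p))

    def random_simplex(n):
        x = rng.random(n)
        return x / x.sum()

    for _ in range(1000):
        p, q = random_simplex(4), random_simplex(4)
        mid = 0.5 * (p + q)
        # midpoint concavity of exp(phi): the value at the midpoint dominates the average
        assert np.exp(phi(mid)) >= 0.5 * (np.exp(phi(p)) + np.exp(phi(q))) - 1e-12
    print("midpoint concavity of exp(phi) holds at all sampled points")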
Differential Geometry Derived from Divergence Functions: Information Geometry Approach
We study the differential-geometrical structure of an information manifold equipped with a divergence function. A divergence function generates a Riemannian metric and, furthermore, provides a symmetric third-order tensor when the divergence is asymmetric. This induces a pair of affine connections dually coupled to each other with respect to the Riemannian metric. This is the structure arising from...
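The construction this entry describes is usually attributed to Eguchi: the induced metric is g_ij(θ) = -∂²D[θ : θ'] / ∂θ_i ∂θ'_j evaluated at θ' = θ. A minimal numerical sketch (the Bernoulli family and finite differences are chosen here purely for illustration) recovers the Fisher information from the Kullback-Leibler divergence:

    import numpy as np

    def kl_bernoulli(t, s):
        # KL divergence between Bernoulli(t) and Bernoulli(s)
        return t * np.log(t / s) + (1 - t) * np.log((1 - t) / (1 - s))

    def induced_metric(theta, h=1e-5):
        # g(theta) = -d^2 D[theta : theta'] / (d theta d theta') at theta' = theta,
        # approximated by a central finite difference for the mixed derivative.
        d = kl_bernoulli
        return -(d(theta + h, theta + h) - d(theta + h, theta - h)
                 - d(theta - h, theta + h) + d(theta - h, theta - h)) / (4 * h * h)

    theta = 0.3
    print(induced_metric(theta))        # ~ 4.76
    print(1 / (theta * (1 - theta)))    # Fisher information of Bernoulli: 4.7619...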
Reference Duality and Representation Duality in Information Geometry
Classical information geometry prescribes, on the parametric family of probability functions Mθ: (i) a Riemannian metric given by the Fisher information; (ii) a pair of dual connections (giving rise to the family of α-connections) that preserve the metric under parallel transport by their joint actions; and (iii) a family of (non-symmetric) divergence functions (α-divergence) defined on Mθ × Mθ...
Nonparametric Information Geometry: From Divergence Function to Referential-Representational Biduality on Statistical Manifolds
Divergence functions are the non-symmetric "distance" on the manifold Mθ of parametric probability density functions over a measure space (X, μ). Classical information geometry prescribes, on Mθ: (i) a Riemannian metric given by the Fisher information; (ii) a pair of dual connections (giving rise to the family of α-connections) that preserve the metric under parallel transport by their joint a...
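For reference, a common form of the α-divergence that these two entries build on (conventions vary across the literature; this one follows Amari's), written in LaTeX:

    D_\alpha(p \,\|\, q) \;=\; \frac{4}{1-\alpha^2} \Bigl( 1 \;-\; \sum_i p_i^{(1-\alpha)/2} \, q_i^{(1+\alpha)/2} \Bigr), \qquad \alpha \neq \pm 1,

with the limits α → -1 and α → +1 giving the Kullback-Leibler divergence KL(p || q) and its dual KL(q || p), and α = 0 giving four times the squared Hellinger distance.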
Publication date: 2010